James–Stein estimation of the first principal component

Authors

Abstract

The Stein paradox has played an influential role in the field of high-dimensional statistics. This result warns that the sample mean, classically regarded as the "usual estimator", may be suboptimal in high dimensions. The development of the James–Stein estimator, which addresses this paradox, has by now inspired a large literature on the theme of "shrinkage". In this direction, we develop a James–Stein-type estimator for the first principal component of a high-dimension, low-sample-size data set. This estimator shrinks the usual estimator, an eigenvector of the sample covariance matrix under a spiked covariance model, and yields superior asymptotic guarantees. Our derivation draws a close connection to the original James–Stein formula, so the motivation and recipe for shrinkage are intuited in a natural way.
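As a rough illustration of the idea, the sketch below (Python/NumPy) computes the leading eigenvector of the sample covariance under a spiked-model setting and shrinks its entries toward their common mean, echoing the original James–Stein formula. The shrinkage target and the intensity `c` are illustrative assumptions for this sketch, not the estimator derived in the paper.

```python
import numpy as np

def js_shrink_pc1(X):
    """Illustrative James-Stein-style shrinkage of the first principal component.

    X is an (n, p) data matrix with few observations n and high dimension p.
    The usual estimator is the leading eigenvector of the sample covariance;
    the shrunken version pulls its entries toward their common mean, much as
    the original James-Stein estimator pulls coordinates toward a central
    value.  The intensity c below is a heuristic stand-in, not the formula
    derived in the paper.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                      # center the data
    S = Xc.T @ Xc / n                            # p x p sample covariance
    eigvals, eigvecs = np.linalg.eigh(S)         # eigenvalues in ascending order
    lam1, h = eigvals[-1], eigvecs[:, -1]        # leading eigenpair (usual estimator)

    m = h.mean()                                 # shrinkage target: mean entry
    resid = h - m                                # deviations from the target
    noise = eigvals[:-1].mean()                  # average non-leading eigenvalue
    # Heuristic James-Stein-style intensity: shrink more when the noise level
    # is large relative to the spread of the eigenvector's entries.
    c = max(0.0, 1.0 - noise / (n * lam1 * (np.sum(resid ** 2) + 1e-12)))
    h_js = m + c * resid
    h_js /= np.linalg.norm(h_js)                 # renormalize to unit length
    return h, h_js
```

On data simulated from a spiked covariance model (a few dozen observations in thousands of dimensions), one would compare the two estimates by the angle each makes with the true spike direction.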


Related articles

Feature reduction of hyperspectral images: Discriminant analysis and the first principal component

When the number of training samples is limited, feature reduction plays an important role in classification of hyperspectral images. In this paper, we propose a supervised feature extraction method based on discriminant analysis (DA) which uses the first principal component (PC1) to weight the scatter matrices. The proposed method, called DA-PC1, copes with the small sample size problem and has...
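A hedged sketch of this kind of construction follows (Python/NumPy): the first principal component of the pooled data is used to weight the within- and between-class scatter matrices before solving a regularized discriminant-analysis eigenproblem. The particular weighting (each sample's projection magnitude onto PC1) is an assumption made for illustration, not necessarily the weighting defined by DA-PC1.

```python
import numpy as np

def da_pc1_features(X, y, k):
    """Sketch of a discriminant-analysis extractor that weights samples by PC1.

    X : (n, p) data matrix, y : (n,) integer class labels, k : output features.
    The per-sample weights (|projection onto PC1|) are an illustrative
    assumption; the DA-PC1 paper defines its own weighting of the scatter
    matrices.
    """
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    pc1 = Vt[0]                               # first principal component
    w = np.abs(Xc @ pc1)                      # per-sample weights (assumed form)
    w /= w.sum()

    p = X.shape[1]
    Sw = np.zeros((p, p))                     # weighted within-class scatter
    Sb = np.zeros((p, p))                     # weighted between-class scatter
    mu = np.average(X, axis=0, weights=w)
    for c in np.unique(y):
        idx = (y == c)
        wc = w[idx]
        mu_c = np.average(X[idx], axis=0, weights=wc)
        d = X[idx] - mu_c
        Sw += (d * wc[:, None]).T @ d
        Sb += wc.sum() * np.outer(mu_c - mu, mu_c - mu)

    # Regularizing Sw copes with the small-sample-size case mentioned above.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(p), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:k]].real           # (p, k) projection matrix
```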


Principal Component Projection Without Principal Component Analysis

We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependen...
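The sketch below illustrates the core idea under stated assumptions: a single ridge step acts on each eigen-direction with eigenvalue s as a soft step s/(s + λ), and composing a sharpening polynomial with itself turns that soft step into an approximate 0/1 projection, using nothing but black-box ridge solves. The polynomial g(t) = 3t² − 2t³ is an illustrative stand-in for the sharpening used in the actual algorithm.

```python
import numpy as np

def make_ridge_step(A, lam):
    """Return a black-box map v -> (A^T A + lam I)^{-1} A^T A v.

    Solved directly here for clarity; in the paper's setting this would be
    any fast ridge-regression routine.
    """
    M = A.T @ A
    K = M + lam * np.eye(A.shape[1])
    return lambda v: np.linalg.solve(K, M @ v)

def project_top_pcs(ridge_step, v, depth=3):
    """Approximately project v onto the principal components of A whose
    eigenvalues (of A^T A) exceed lam, using only ridge steps.

    One ridge step weights an eigen-direction by t = s / (s + lam), a soft
    step around 1/2.  Each recursion level applies g(t) = 3 t^2 - 2 t^3 to
    the previous operator, pushing t > 1/2 toward 1 and t < 1/2 toward 0.
    """
    if depth == 0:
        return ridge_step(v)                              # R v
    mv = project_top_pcs(ridge_step, v, depth - 1)        # M v
    mmv = project_top_pcs(ridge_step, mv, depth - 1)      # M^2 v
    mmmv = project_top_pcs(ridge_step, mmv, depth - 1)    # M^3 v
    return 3.0 * mmv - 2.0 * mmmv                         # g(M) v
```

The cost grows as 3^depth ridge solves in this naive form; it is meant only to show how projection can be expressed through ridge calls without ever forming the principal components.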


Intrinsic dimension estimation of data by principal component analysis

Estimating intrinsic dimensionality of data is a classic problem in pattern recognition and statistics. Principal Component Analysis (PCA) is a powerful tool in discovering dimensionality of data sets with a linear structure; it, however, becomes ineffective when data have a nonlinear structure. In this paper, we propose a new PCA-based method to estimate intrinsic dimension of data with nonlin...
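As a baseline for the linear case, a common PCA-based estimate keeps the smallest number of components that explain a fixed fraction of the total variance; a minimal sketch follows. The 95% threshold is an assumption, and this linear baseline does not capture the nonlinear extension the paper proposes.

```python
import numpy as np

def pca_intrinsic_dimension(X, var_threshold=0.95):
    """Baseline PCA estimate of intrinsic dimension: the smallest number of
    principal components whose eigenvalues explain var_threshold of the
    total variance.  Captures only linear structure in the data.
    """
    Xc = X - X.mean(axis=0)
    # Singular values of the centered data give the PCA eigenvalue spectrum.
    s = np.linalg.svd(Xc, compute_uv=False)
    explained = np.cumsum(s ** 2) / np.sum(s ** 2)
    return int(np.searchsorted(explained, var_threshold) + 1)
```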


Compression of Breast Cancer Images By Principal Component Analysis

The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the direction with the maximum variance of X in R^N carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
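A minimal sketch of that representation, assuming one flattened image per row of X and a chosen number k of retained components:

```python
import numpy as np

def pca_compress(X, k):
    """Compress rows of X by keeping their coordinates in the top-k
    eigenvector basis of the covariance matrix, then reconstruct.

    X : (n, N) matrix, one flattened image per row (an assumption here).
    Returns the (n, k) compressed codes and the (n, N) reconstruction.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    # Right singular vectors of the centered data are the covariance eigenvectors.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:k].T                     # (N, k) top-k principal components
    codes = Xc @ V                   # compressed representation
    X_hat = codes @ V.T + mean       # reconstruction from k components
    return codes, X_hat
```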



Journal

Journal title: Stat

Year: 2022

ISSN: 2049-1573

DOI: https://doi.org/10.1002/sta4.419